Search for: All records
Creators/Authors contains: "Estrin, Deborah"
  1. Background: Although family caregivers play a critical role in care delivery, research has shown that they face significant physical, emotional, and informational challenges. One promising avenue to address some of caregivers' unmet needs is the design of digital technologies that support caregivers' complex portfolio of responsibilities. Augmented reality (AR) applications, specifically, offer new affordances to aid caregivers as they perform care tasks in the home.
     Objective: This study explored how AR might assist family caregivers with the delivery of home-based cancer care. The specific objectives were to shed light on challenges caregivers face where AR might help, investigate opportunities for AR to support caregivers, and understand the risks of AR exacerbating caregiver burdens.
     Methods: We conducted a qualitative video elicitation study with clinicians and caregivers. We created 3 video elicitations that offer ways in which AR might support caregivers as they perform often high-stakes, unfamiliar, and anxiety-inducing tasks in postsurgical cancer care: wound care, drain care, and rehabilitative exercise. The elicitations show functional AR applications built using Unity Technologies software and the Microsoft HoloLens 2. Using elicitations enabled us to avoid rediscovering known usability issues with current AR technologies, allowing us to focus on high-level, substantive feedback on potential future roles for AR in caregiving. It also enabled nonintrusive exploration of the inherently sensitive in-home cancer care context.
     Results: We recruited 22 participants: 15 clinicians (eg, oncologists and nurses) and 7 family caregivers. Our findings shed light on clinicians' and caregivers' perceptions of the information and communication challenges caregivers face as they perform important physical care tasks as part of cancer treatment plans. Most significant was the need for better, ongoing support for the execution of caregiving tasks in situ, when and where the tasks need to be performed. Such support needs to be tailored to the specific needs of the patient, to the stress-impaired capacities of the caregiver, and to the time-constrained availability of clinicians. We uncovered opportunities for AR technologies to increase caregiver confidence and reduce anxiety by supporting the capture and review of images and videos and by improving communication with clinicians. However, our findings also suggest ways in which, if not deployed carefully, AR technologies might exacerbate caregivers' already significant burdens.
     Conclusions: These findings can inform both the design of future AR devices, software, and applications and the design of caregiver support interventions based on already available technology and processes. Our study suggests that AR technologies and the affordances they provide (eg, tailored support, enhanced monitoring and task accuracy, and improved communication) should be considered part of an integrated care journey involving multiple stakeholders, changing information needs, and different communication channels that blend in-person and internet-based, synchronous and asynchronous care, illness, and recovery.
  2. Digital biomarkers of mental health, created using data extracted from everyday technologies including smartphones, wearable devices, social media and computer interactions, have the opportunity to revolutionise mental health diagnosis and treatment by providing near-continuous unobtrusive and remote measures of behaviours associated with mental health symptoms. Machine learning models process data traces from these technologies to identify digital biomarkers. In this editorial, we caution clinicians against using digital biomarkers in practice until models are assessed for equitable predictions (‘model equity’) across demographically diverse patients at scale, behaviours over time, and data types extracted from different devices and platforms. We posit that it will be difficult for any individual clinic or large-scale study to assess and ensure model equity and alternatively call for the creation of a repository of open de-identified data for digital biomarker development. 
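     The per-group assessment this editorial calls for can be illustrated with a minimal sketch: disaggregate a model's performance by demographic group and report the worst-case gap. This is not the authors' method; the function names, the choice of accuracy as the metric, and the toy data are all illustrative assumptions.

     ```python
     from collections import defaultdict

     def per_group_accuracy(y_true, y_pred, groups):
         """Accuracy of a digital-biomarker model within each demographic group.

         A large gap between groups is one warning sign of model inequity.
         (Illustrative sketch; names and metric choice are assumptions,
         not taken from the editorial.)
         """
         hits = defaultdict(int)
         totals = defaultdict(int)
         for t, p, g in zip(y_true, y_pred, groups):
             totals[g] += 1
             hits[g] += int(t == p)
         return {g: hits[g] / totals[g] for g in totals}

     def equity_gap(metrics):
         """Worst-case difference in per-group performance."""
         vals = list(metrics.values())
         return max(vals) - min(vals)

     # Toy example: the model is perfect for group "a" but poor for group "b".
     m = per_group_accuracy(
         y_true=[1, 0, 1, 1, 0, 0],
         y_pred=[1, 0, 1, 0, 1, 0],
         groups=["a", "a", "a", "b", "b", "b"],
     )
     ```

     In practice, accuracy would be replaced by a clinically meaningful metric, and the same disaggregation would be repeated across time windows and device types, as the editorial argues.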
  3. Background: Mobile health technology has demonstrated the ability of smartphone apps and sensors to collect data pertaining to patient activity, behavior, and cognition. It also offers the opportunity to understand, through continuous sensing, how everyday passive mobile metrics such as battery life and screen time relate to mental health outcomes. Impulsivity is an underlying factor in numerous physical and mental health problems. However, few studies have been designed to help us understand how mobile sensors and self-report data can improve our understanding of impulsive behavior.
     Objective: The objective of this study was to explore the feasibility of using mobile sensor data to passively detect and monitor self-reported state impulsivity and impulsive behavior via a cross-platform mobile sensing application.
     Methods: We enrolled 26 participants, who were part of a larger study of impulsivity, in a real-world, continuous mobile sensing study over 21 days on both the Apple operating system (iOS) and Android platforms. The mobile sensing system (mPulse) collected data from call logs, battery charging, and screen checking. To validate the model, we used mobile sensing features to predict common self-reported impulsivity traits; objective mobile behavioral and cognitive measures; and ecological momentary assessment (EMA) of state impulsivity and constructs related to impulsive behavior (ie, risk-taking, attention, and affect).
     Results: Overall, the findings suggested that passive measures of mobile phone use such as call logs, battery charging, and screen checking can predict different facets of trait and state impulsivity and impulsive behavior. For impulsivity traits, the models significantly explained variance in the sensation-seeking, planning, and lack-of-perseverance traits but failed to explain the motor, urgency, lack-of-premeditation, and attention traits. Passive sensing features from call logs, battery charging, and screen checking were particularly useful in explaining and predicting trait-based sensation seeking. At the daily level, the model successfully predicted objective behavioral measures such as present bias in delay discounting tasks, commission and omission errors in a cognitive attention task, and total gains in a risk-taking task. Our models also predicted daily EMA questions on positivity, stress, productivity, healthiness, and emotion and affect. Perhaps most intriguingly, the model failed to predict daily EMA designed to measure previous-day impulsivity using face-valid questions.
     Conclusions: The study demonstrated the potential for developing trait and state impulsivity phenotypes and detecting impulsive behavior from everyday mobile phone sensors. Limitations of the current research and suggestions for building more precise passive sensing models are discussed.
     Trial Registration: ClinicalTrials.gov NCT03006653; https://clinicaltrials.gov/ct2/show/NCT03006653
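     As a toy illustration of the per-day feature aggregation such a pipeline performs before any modeling, the sketch below counts daily screen checks, charging sessions, and calls from a raw event stream. The event names and `(timestamp, kind)` schema are hypothetical, not the actual mPulse implementation.

     ```python
     from datetime import datetime

     def daily_features(events):
         """Aggregate raw phone events into per-day passive-sensing features.

         `events` is a list of (iso_timestamp, kind) pairs, with kind in
         {"screen_on", "charge_start", "call"}. These names are assumed
         for illustration; the real sensing system's schema may differ.
         """
         feats = {}
         for ts, kind in events:
             day = datetime.fromisoformat(ts).date().isoformat()
             f = feats.setdefault(
                 day, {"screen_checks": 0, "charges": 0, "calls": 0}
             )
             if kind == "screen_on":
                 f["screen_checks"] += 1
             elif kind == "charge_start":
                 f["charges"] += 1
             elif kind == "call":
                 f["calls"] += 1
         return feats

     events = [
         ("2021-03-01T08:00:00", "screen_on"),
         ("2021-03-01T08:05:00", "call"),
         ("2021-03-01T22:00:00", "charge_start"),
         ("2021-03-02T09:00:00", "screen_on"),
         ("2021-03-02T09:30:00", "screen_on"),
     ]
     features = daily_features(events)
     ```

     Features like these would then be paired with each day's EMA responses or behavioral-task scores to fit the kind of daily-level predictive models the study describes.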